Bregman distances in exponential families of probability measures
Abstract
The concept of Bregman distances for Euclidean space vectors was introduced by Bregman (1967) in the context of convex programming. In this setup, the Bregman method has been widely applied and adapted, especially for the design of regularization algorithms for finding a good approximate solution of inverse problems, e.g. in image processing (tomography etc.); see for instance Censor and Lent (1981), Eggermont (1993), Byrne (1999), Resmerita (2005), Silva Neto and Cella (2006), Resmerita and Scherzer (2007), Resmerita and Anderssen (2007), Xu and Osher (2007), Burger et al. (2008), Cai et al. (2008), Marquina and Osher (2008), Osher et al. (2008), Scherzer et al. (2008), and the references therein. Bregman distances for non-negative functions were treated in Csiszár (1995). In the context of information theory and statistical decision theory, Bregman distances were studied e.g. by Csiszár (1991, 1994) as well as Pardo and Vajda (1997, 2003), basically for discrete probability measures or related functional quantities; closely related concepts are also applied in machine learning, see e.g. Lafferty et al. (1997), Kivinen and Warmuth (1999), Lafferty (1999), Collins et al. (2001), Della Pietra et al. (2002), Murata et al. (2004), as well as Cesa-Bianchi and Lugosi (2006). Applications to statistical physics are given e.g. in Topsøe (2007).
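For orientation (notation ours, not quoted from the abstract): for a differentiable, strictly convex function f on R^d, the Bregman distance of Bregman (1967) between vectors x and y is

\[ D_f(x, y) \;=\; f(x) - f(y) - \langle \nabla f(y),\, x - y \rangle , \]

which is non-negative and vanishes if and only if x = y. The choice f(x) = \|x\|^2 recovers the squared Euclidean distance, while f(x) = \sum_i x_i \log x_i on the probability simplex yields the Kullback-Leibler divergence.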
Similar articles
Applications of Information Geometry to Audio Signal Processing
In this talk, we present some applications of information geometry to audio signal processing. We seek a comprehensive framework that allows us to quantify, process and represent the information contained in audio signals. In digital audio, a sound signal is generally encoded as a waveform, and a common problem is to extract relevant information about the signal by computing sound features fro...
Monte Carlo Information Geometry: The dually flat case
Exponential families and mixture families are parametric probability models that can be geometrically studied as smooth statistical manifolds with respect to any statistical divergence like the Kullback-Leibler (KL) divergence or the Hellinger divergence. When equipping a statistical manifold with the KL divergence, the induced manifold structure is dually flat, and the KL divergence between dis...
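A standard fact behind this dual flatness, sketched here in our own notation (not quoted from the abstract): for an exponential family p_\theta(x) = h(x) \exp(\langle \theta, t(x) \rangle - F(\theta)) with log-normalizer F, the KL divergence between two members reduces to a Bregman distance of F on the natural parameters,

\[ \mathrm{KL}(p_{\theta_1} \,\|\, p_{\theta_2}) \;=\; F(\theta_2) - F(\theta_1) - \langle \nabla F(\theta_1),\, \theta_2 - \theta_1 \rangle \;=\; B_F(\theta_2, \theta_1), \]

using the identity \nabla F(\theta_1) = E_{\theta_1}[t(X)].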
Simplification and hierarchical representations of mixtures of exponential families
A mixture model in statistics is a powerful framework commonly used to estimate the probability density function of a random variable. Most algorithms handling mixture models were originally designed specifically for processing mixtures of Gaussians. However, other distributions such as the Poisson, multinomial, and Gamma/Beta have gained interest in signal processing in the past decades. These common ...
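As a minimal illustration in our own notation (not from the abstract), a k-component mixture of exponential-family densities p_F(x; \theta) has the form

\[ m(x) \;=\; \sum_{i=1}^{k} w_i \, p_F(x; \theta_i), \qquad w_i \ge 0, \quad \sum_{i=1}^{k} w_i = 1, \]

of which the Gaussian mixture model is the classical special case; simplification then amounts to replacing this sum by a mixture with fewer components that remains close in a suitable divergence.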
Testing a Point Null Hypothesis against One-Sided for Non Regular and Exponential Families: The Reconcilability Condition to P-values and Posterior Probability
In this paper, the reconcilability between the P-value and the posterior probability in testing a point null hypothesis against a one-sided hypothesis is considered. Two essential families, the non-regular and the exponential family of distributions, are studied. It is shown that in a non-regular family of distributions, in some cases, it is possible to find a prior distribution function under which P-va...
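For context, the standard ingredients being compared (our notation, not quoted from the abstract): testing H_0: \theta = \theta_0 against H_1: \theta > \theta_0 with a prior that puts mass \pi_0 on \theta_0 and spreads 1 - \pi_0 over the alternative with density g gives the posterior probability

\[ P(H_0 \mid x) \;=\; \frac{\pi_0 \, f(x \mid \theta_0)}{\pi_0 \, f(x \mid \theta_0) + (1 - \pi_0) \int f(x \mid \theta)\, g(\theta)\, d\theta}, \]

which is then compared with the one-sided P-value P_{\theta_0}(T \ge t_{\mathrm{obs}}); reconcilability asks under which priors the two quantities can be made to agree.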
Worst-Case and Smoothed Analysis of the k-Means Method with Bregman Divergences
The k-means algorithm is the method of choice for clustering large-scale data sets, and it performs exceedingly well in practice despite its exponential worst-case running time. To narrow the gap between theory and practice, k-means has been studied in the semi-random input model of smoothed analysis, which often leads to more realistic conclusions than mere worst-case analysis. For the case tha...
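For reference (our notation, not taken from the abstract), k-means with a Bregman divergence d_\varphi minimizes the clustering cost

\[ \sum_{j=1}^{k} \sum_{x \in C_j} d_\varphi(x, c_j), \qquad d_\varphi(x, c) = \varphi(x) - \varphi(c) - \langle \nabla \varphi(c),\, x - c \rangle , \]

and for every Bregman divergence the optimal center of a fixed cluster C_j is its arithmetic mean, c_j = \frac{1}{|C_j|} \sum_{x \in C_j} x (Banerjee et al., 2005), which is why Lloyd-type iterations carry over from the squared-Euclidean case unchanged.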